
    Governance of Offshore IT Outsourcing at Shell Global Functions IT-BAM: Development and Application of a Governance Framework to Improve Outsourcing Relationships

    The lack of effective IT governance is widely recognized as a key inhibitor to successful global IT outsourcing relationships. In this study we present the development and application of a governance framework to improve outsourcing relationships. The approach used to develop the IT governance framework includes a meta-model and a customization process to fit the framework to the target organization. The IT governance framework consists of four elements: (1) organisational structures, (2) joint processes between the insourcer and the outsourcer, (3) responsibilities that link roles to processes, and (4) a diverse set of control indicators to measure the success of the relationship. The framework was put into practice in Shell GF IT-BAM, a part of Shell that had concluded it lacked management control over at least one of its outsourcing relationships. In a workshop, the governance framework was used to perform a gap analysis between the current and the desired governance. Several gaps were identified in the way roles and responsibilities are assigned and joint processes are set up. Moreover, the workshop also demonstrated the usefulness and usability of the IT governance framework in structuring, providing input to, and managing stakeholders in discussions around IT governance.
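    To make the framework's four elements concrete, here is a minimal sketch that encodes them as data structures and runs the kind of current-versus-desired gap analysis described above. All names, the RACI-style responsibility encoding, and the sample entries are illustrative assumptions, not taken from the paper.

```python
# Hypothetical encoding of the four framework elements; names are invented.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Responsibility:
    role: str          # e.g. "Contract Manager" (invented example)
    process: str       # e.g. "Joint demand planning"
    kind: str          # RACI-style label: "Accountable", "Responsible", ...

@dataclass
class GovernanceFramework:
    structures: set = field(default_factory=set)        # (1) organisational structures
    processes: set = field(default_factory=set)         # (2) joint in-/outsourcer processes
    responsibilities: set = field(default_factory=set)  # (3) role-to-process links
    indicators: set = field(default_factory=set)        # (4) control indicators

def gap_analysis(current: GovernanceFramework, desired: GovernanceFramework) -> dict:
    """Per framework element, report what the desired governance adds
    over the current one (the workshop-style gap analysis)."""
    return {
        "missing_structures": desired.structures - current.structures,
        "missing_processes": desired.processes - current.processes,
        "missing_responsibilities": desired.responsibilities - current.responsibilities,
        "missing_indicators": desired.indicators - current.indicators,
    }

current = GovernanceFramework(processes={"Incident escalation"})
desired = GovernanceFramework(
    processes={"Incident escalation", "Joint demand planning"},
    responsibilities={Responsibility("Contract Manager", "Joint demand planning", "Accountable")},
)
print(gap_analysis(current, desired)["missing_processes"])  # {'Joint demand planning'}
```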

    Gender differences in the use of cardiovascular interventions in HIV-positive persons; the D:A:D Study

    Peer reviewed.

    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    Ristola M. is a member of the D:A:D Study Group, the Royal Free Hospital Clinic Cohort, the INSIGHT Study Group, the SMART Study Group, and the ESPRIT Study Group.

    Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice.

    Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >= 3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (> 3 mo apart) eGFR <= 60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group (risk score < 0), with the risk rising through the medium risk group (risk score 0-4) to the high risk group (risk score >= 5, 505 events). Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria.

    Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians weighing the benefits of certain antiretrovirals against the risk of CKD, and for identifying those at greatest risk of CKD.
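    As a rough illustration of how a points-based score of this kind works, the sketch below sums pre-scaled points per risk factor and converts a 5-y baseline risk plus a drug's incidence rate ratio (IRR) into an NNTH via the common approximation NNTH = 1 / (baseline risk * (IRR - 1)). The point values and the IRR of 1.23 are invented, back-solved assumptions for illustration only; they are not the paper's published coefficients.

```python
# Illustrative arithmetic only: point values and the IRR below are invented,
# not the D:A:D study's published coefficients.

def risk_score(points: dict) -> int:
    """Points-based score: pre-scaled points per risk factor, summed."""
    return sum(points.values())

def nnth(five_year_risk: float, irr: float) -> float:
    """Number needed to harm over 5 y, approximating the excess risk
    attributable to the drug as baseline_risk * (IRR - 1)."""
    return 1.0 / (five_year_risk * (irr - 1.0))

# Hypothetical patient profile (points are made up):
patient = {"age_band": 4, "hypertension": 2, "low_nadir_cd4": 2, "hcv_coinfection": 1}
print(risk_score(patient))              # 9 -> would fall in the high risk band (>= 5)

# Low risk group from the abstract: 1 in 393 chance of CKD within 5 y.
print(round(nnth(1 / 393, irr=1.23)))   # ~1709, the order of the reported NNTH of 1,702
```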

    Usability and Acceptance of Wearable Biosensors in Forensic Psychiatry: Cross-sectional Questionnaire Study

    The use of wearable biosensor devices for monitoring and coaching in forensic psychiatric settings raises high expectations for improved self-regulation of emotions and behavior in clients and staff members. This holds all the more for clients with mild intellectual disabilities (IQ 50-85), who might benefit from these biosensors because the devices are easy to use in everyday life, allowing clients to practice with them in multiple stress- and arousal-inducing situations. However, research on the (continuous) use and acceptance of biosensors in forensic psychiatry for clients with mild intellectual disabilities and their caretakers is scarce. Although wearable biosensors show promise for health care, recent research has shown that acceptance and continuous use of wearable devices among consumers is lower than anticipated, probably due to low expectations.

    Associations of sympathetic and parasympathetic activity in job stress and burnout: A systematic review.

    This systematic review examines the relationship between sympathetic and parasympathetic activity on the one hand and job stress and burnout on the other; it is registered at PROSPERO under CRD42016035918.

    BACKGROUND: Previous research has shown that prolonged job stress may lead to burnout and that differences in heart rate variability are apparent in people with heightened job stress.

    AIMS: This systematic review studies the associations between job stress or burnout and heart rate (variability) or skin conductance. In addition, it investigates which guidelines, if any, are available for the ambulatory assessment and reporting of such measurements.

    METHODS: We extracted data from relevant databases following the PRESS checklist and contacted authors for additional resources. Eligible studies involved employed adults, used validated job stress or burnout questionnaires, and examined heart rate or electrodermal activity. Synthesis followed the PRISMA guidelines for reporting systematic reviews.

    RESULTS: The results showed a positive association between job stress and heart rate, and a negative association between job stress and heart rate variability measures. No definite conclusion could be drawn with regard to burnout and psychophysiological measures. No studies on electrodermal activity could be included based on the inclusion criteria.

    CONCLUSIONS: High levels of job stress are associated with an increased heart rate and decreased heart rate variability measures. Recommendations for ambulatory assessment and reporting (STROBE) are discussed in light of the findings.
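    For readers unfamiliar with the outcome measures, the sketch below computes mean heart rate and the two standard time-domain heart rate variability measures this literature typically reports, from a series of RR intervals. The RR values are a fabricated sample input for illustration.

```python
import math

def mean_hr(rr_ms: list) -> float:
    """Mean heart rate (beats/min) from RR intervals in milliseconds."""
    return 60000.0 / (sum(rr_ms) / len(rr_ms))

def sdnn(rr_ms: list) -> float:
    """SDNN: standard deviation of RR (NN) intervals, in ms."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / (len(rr_ms) - 1))

def rmssd(rr_ms: list) -> float:
    """RMSSD: root mean square of successive RR differences, in ms;
    commonly interpreted as an index of parasympathetic (vagal) activity."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))

rr = [812, 790, 846, 801, 779, 830, 808, 795]  # invented RR series, ms
print(f"HR {mean_hr(rr):.1f} bpm, SDNN {sdnn(rr):.1f} ms, RMSSD {rmssd(rr):.1f} ms")
```

    Under the review's findings, higher job stress would show up in such data as a higher mean heart rate together with lower SDNN/RMSSD values.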

    Future response of the Wadden Sea tidal basins to relative sea-level rise — an aggregated modelling approach

    Climate change, and especially the associated acceleration of sea-level rise, forms a serious threat to the Wadden Sea. The Wadden Sea contains the world's largest coherent intertidal flat area, and it is known that these flats can drown when the rate of sea-level rise exceeds a critical limit. The intertidal flats would then be permanently inundated, seriously affecting the ecological functioning of the system. Determining this critical limit and modelling the transient process by which a tidal basin responds to accelerated sea-level rise are therefore of critical importance. In this contribution we revisit the modelling of the response of the Wadden Sea tidal basins to sea-level rise using a basin-scale morphological model (aggregated scale morphological interaction between tidal basin and adjacent coast, ASMITA). Analysis using this aggregated-scale model shows that the critical rate of sea-level rise is not merely influenced by the morphological equilibrium and the morphological time scale, but also depends on the grain size distribution of sediment in the tidal inlet system. As sea level rises, there is a lag in the morphological response, which means that the basin will be deeper than its morphological equilibrium. However, as long as the rate of sea-level rise is constant and below the critical limit, this offset becomes constant and a dynamic equilibrium is established. Both this equilibrium deviation and the time needed to achieve the dynamic equilibrium increase non-linearly with increasing rates of sea-level rise. As a result, the response of a tidal basin to relatively fast sea-level rise is similar whether the rate of rise is just below, equal to, or above the critical limit: a tidal basin will experience a long process of 'drowning' whenever the rate of sea-level rise exceeds about 80% of the critical limit. The insights from the present study can be used to improve morphodynamic modelling of the tidal basin response to accelerating sea-level rise and are useful for the sustainable management of tidal inlet systems.
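    The dynamic-equilibrium argument can be illustrated with a toy aggregated model, far simpler than ASMITA itself: flat depth deepens at the rate of sea-level rise and is counteracted by sediment import whose capacity saturates. Every parameter value below is invented for illustration.

```python
def simulate(R, d_eq=1.0, s_max=0.01, n=2, years=2000, dt=1.0):
    """Euler-integrate flat depth d (m below mean sea level):
        dd/dt = R - s_max * (1 - (d_eq / d)**n),
    i.e. sea-level rise at rate R deepens the flats while sediment
    import (capacity s_max) fills them back toward equilibrium d_eq."""
    d = d_eq
    for _ in range(int(years / dt)):
        d += (R - s_max * (1.0 - (d_eq / d) ** n)) * dt
    return d

# In this toy model the import capacity s_max acts as the critical rate.
for R in (0.002, 0.008, 0.012):  # m/y: low, ~80% of critical, above critical
    print(f"R = {R:.3f} m/y -> depth offset after 2000 y: {simulate(R) - 1.0:.2f} m")
# Below the critical rate the offset settles at d_eq * ((1 - R/s_max)**(-1/n) - 1),
# a dynamic equilibrium that grows non-linearly with R and whose adjustment time
# blows up as R approaches s_max; above it, the flats keep deepening ('drowning').
```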